
Refactor default system prompt to be closer to LLM client creation#106

Open
jgieringer wants to merge 3 commits into main from jgieringer/default-sys-prompt

Conversation

@jgieringer
Collaborator

Description

This PR un-buries the default system prompt. Now all LLM clients have the default of "You are a helpful AI assistant." unless overridden.

Issue

Resolves comment: https://github.com/SpringCare/VERA-MH/pull/104/changes/BASE..d4debf20595196156b6bc49c7608675379b5dd85#r2796492477

@jgieringer jgieringer changed the base branch from main to jgieringer/convo-id February 12, 2026 17:53
Contributor

Copilot AI left a comment


Pull request overview

This PR standardizes the default system prompt across all LLM clients by moving the default into LLMInterface, so callers that omit system_prompt now get "You are a helpful AI assistant." automatically (unless they explicitly pass an empty string to disable it).

Changes:

  • Introduces DEFAULT_SYSTEM_PROMPT in LLMInterface and applies it when system_prompt is None (preserving "" as “no prompt”).
  • Updates ConversationRunner to pass through system_prompt from config without injecting its own fallback string.
  • Updates unit/integration tests and docs to reflect the new default behavior and the “explicit empty string disables prompt” semantics.

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 1 comment.

Summary per file:

  • llm_clients/llm_interface.py: Adds DEFAULT_SYSTEM_PROMPT and changes initialization semantics to default on None (not on falsy).
  • generate_conversations/runner.py: Removes runner-level default prompt injection and relies on LLMInterface defaulting.
  • tests/integration/test_conversation_runner.py: Adjusts integration expectations: runner passes None, base class applies default.
  • tests/unit/llm_clients/test_llm_interface.py: Adds coverage for default-on-None and preserving explicit empty string.
  • tests/unit/llm_clients/test_openai_llm.py: Updates “no system prompt” test to pass system_prompt="".
  • tests/unit/llm_clients/test_claude_llm.py: Updates “no system prompt” test to pass system_prompt="".
  • tests/unit/llm_clients/test_gemini_llm.py: Updates “no system prompt” test to pass system_prompt="".
  • tests/unit/llm_clients/test_azure_llm.py: Updates “no system prompt” test to pass system_prompt="".
  • tests/unit/llm_clients/test_ollama_llm.py: Updates several tests to explicitly disable system prompts where needed; asserts default prompt when None.
  • docs/evaluating.md: Clarifies conversation history flow and refines conversation_id wording.
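The new coverage in test_llm_interface.py might look roughly like the sketch below. The test names and the `FakeLLM` stand-in are assumptions for illustration; the real tests exercise the actual LLMInterface subclasses:

```python
# Hypothetical sketch of the default-on-None tests; FakeLLM stands in
# for LLMInterface, and test names are assumed, not taken from the PR.
DEFAULT_SYSTEM_PROMPT = "You are a helpful AI assistant."


class FakeLLM:
    """Stand-in mirroring the assumed None-vs-"" semantics."""

    def __init__(self, system_prompt=None):
        # Default only when the caller passed nothing (None);
        # an explicit "" means "no system prompt at all".
        self.system_prompt = (
            DEFAULT_SYSTEM_PROMPT if system_prompt is None else system_prompt
        )


def test_none_applies_default():
    assert FakeLLM().system_prompt == DEFAULT_SYSTEM_PROMPT


def test_empty_string_disables_prompt():
    assert FakeLLM(system_prompt="").system_prompt == ""


test_none_applies_default()
test_empty_string_disables_prompt()
```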


Base automatically changed from jgieringer/convo-id to main February 12, 2026 23:38
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 11 out of 11 changed files in this pull request and generated no new comments.



@emily-vanark
Collaborator

emily-vanark commented Feb 19, 2026

So... if I want to run an API endpoint without a system prompt... do I just leave that out of my generate response?

@jgieringer
Collaborator Author

So... if I want to run an API endpoint without a system prompt... do I just leave that out of my generate response?

Yes! Since all of our work up to now has been on raw LLM calls, a system prompt has been required.

I did something similar with start_prompt in the custom API LLM client: I set start_prompt = None whenever _start_url is not None, just so start_prompt wouldn't end up in the metadata. This could definitely be cleaner, but since it's an example, I wanted to ship it and improve over time if really needed.
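That pattern might look roughly like the sketch below. The class name, the metadata shape, and the `_start_url` constructor argument are assumptions about the custom API client, not verified against the repository; only the `start_prompt = None if _start_url is not None` logic comes from the comment above:

```python
# Hypothetical sketch of the start_prompt/metadata pattern described
# in the comment; names and metadata shape are assumptions.
class CustomAPIClient:
    def __init__(self, start_prompt=None, start_url=None):
        self._start_url = start_url
        # If an API endpoint drives the conversation start itself,
        # clear start_prompt so it never appears in run metadata.
        self.start_prompt = None if self._start_url is not None else start_prompt

    def metadata(self):
        meta = {"client": "custom_api"}
        if self.start_prompt is not None:
            meta["start_prompt"] = self.start_prompt
        return meta
```

With a start URL configured, `metadata()` omits the start_prompt key entirely; without one, the prompt passes through unchanged.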

I could add another level of abstraction that is more for raw LLM clients that require system messages and such, but I wonder if it's too much?

cc @emily-vanark

@sator-labs
Collaborator

So... if I want to run an API endpoint without a system prompt... do I just leave that out of my generate response?

I might not be understanding what's happening, but if you don't include one, will it revert to the default?
